Iteration Complexity of Fixed-Step Methods by Nesterov and Polyak for Convex Quadratic Functions

Authors

Abstract

This note considers the momentum method by Polyak and the accelerated gradient method by Nesterov, both without line search but with a fixed step length, applied to strongly convex quadratic functions, assuming that exact gradients are used and that appropriate upper and lower bounds for the extreme eigenvalues of the Hessian matrix are known. Simple 2-d examples show that the Euclidean distance of the iterates to the optimal solution is non-monotone. In this context, an explicit bound is derived on the number of iterations needed to guarantee a reduction by a factor $$\epsilon$$. For both methods the bound is optimal up to a constant factor; it complements earlier asymptotically optimal results for the momentum method and establishes another link to Nesterov's method.
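As a hedged illustration (not the paper's code or its exact example), the sketch below applies both fixed-step methods to a hypothetical 2-d quadratic, using the standard textbook step and momentum constants derived from the eigenvalue bounds $$\mu$$ and $$L$$; the paper's exact parametrization may differ. Counting the iterations at which the distance to the minimizer grows reproduces the non-monotone behaviour described above.

```python
import numpy as np

# Hypothetical 2-d instance: f(x) = 0.5 * x^T A x with minimizer x* = 0.  The
# methods only use the bounds mu <= lambda_min(A) and L >= lambda_max(A), as in
# the setting of the paper; all concrete numbers here are made up.
A = np.diag([2.0, 80.0])
mu, L = 1.0, 100.0                        # known eigenvalue bounds (not tight)
kappa = L / mu
grad = lambda x: A @ x

def polyak(x0, iters):
    # Polyak's momentum (heavy-ball) method with fixed steps derived from mu, L.
    alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
    beta = ((np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)) ** 2
    x_prev = x = x0
    dists = [np.linalg.norm(x)]
    for _ in range(iters):
        x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x
        dists.append(np.linalg.norm(x))
    return dists

def nesterov(x0, iters):
    # Nesterov's accelerated gradient, constant-momentum variant for strongly
    # convex functions: step 1/L, momentum (sqrt(kappa)-1)/(sqrt(kappa)+1).
    beta = (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)
    x_prev = x = x0
    dists = [np.linalg.norm(x)]
    for _ in range(iters):
        y = x + beta * (x - x_prev)
        x, x_prev = y - grad(y) / L, x
        dists.append(np.linalg.norm(x))
    return dists

x0 = np.array([1.0, 1.0])
for name, d in [("Polyak", polyak(x0, 60)), ("Nesterov", nesterov(x0, 60))]:
    ups = sum(b > a for a, b in zip(d, d[1:]))   # steps where the distance grows
    print(f"{name:8s} final ||x - x*|| = {d[-1]:.2e}, non-monotone steps: {ups}")
```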

Similar References

Iteration Complexity of Feasible Descent Methods for Convex Optimization

In many machine learning problems, such as the dual form of the SVM, the objective function to be minimized is convex but not strongly convex. This fact makes it difficult to obtain the complexity of some commonly used optimization algorithms. In this paper, we prove global linear convergence for a wide range of algorithms when they are applied to some non-strongly convex problems. In partic...
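For orientation, a minimal sketch of one member of the feasible descent family the snippet refers to: projected gradient descent on a hypothetical box-constrained least-squares instance whose Hessian is singular, so the objective is convex but not strongly convex. All problem data below are made up for illustration.

```python
import numpy as np

# Hypothetical dual-SVM-like instance: f(x) = 0.5 * ||B x - b||^2 over [0, 1]^n.
# B is rank-deficient, so the Hessian B^T B is singular: convex, not strongly convex.
rng = np.random.default_rng(0)
n, r = 20, 5
B = rng.standard_normal((r, n))          # rank r < n
b = rng.standard_normal(r)
L_est = np.linalg.norm(B, 2) ** 2        # Lipschitz constant of the gradient

f = lambda x: 0.5 * np.sum((B @ x - b) ** 2)
x = np.zeros(n)
for k in range(201):
    # Projected gradient step: every iterate stays feasible (inside the box).
    x = np.clip(x - (B.T @ (B @ x - b)) / L_est, 0.0, 1.0)
    if k % 50 == 0:
        print(f"iter {k:3d}: f(x) = {f(x):.6e}")
```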

Iteration-complexity of first-order penalty methods for convex programming

This paper considers a special but broad class of convex programming (CP) problems whose feasible region is a simple compact convex set intersected with the inverse image of a closed convex cone under an affine transformation. It studies the computational complexity of quadratic penalty based methods for solving the above class of problems. An iteration of these methods, which is simply an iter...
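A rough sketch of the quadratic penalty idea in this setting, with the cone taken as $$K = \{0\}$$ so the affine constraint reduces to $$Ax = b$$, a box as the simple compact convex set, and a plain projected gradient inner solver rather than whatever accelerated first-order iteration the (truncated) text refers to; all data are hypothetical.

```python
import numpy as np

# Hypothetical instance: minimize f(x) = 0.5 * ||x - c||^2 over the box [0, 1]^n
# subject to A x = b, i.e. A x - b lies in the cone K = {0}.
rng = np.random.default_rng(1)
n, m = 10, 3
A = rng.standard_normal((m, n))
c = rng.standard_normal(n)
b = A @ np.clip(rng.random(n), 0.2, 0.8)        # chosen so the problem is feasible

x = np.zeros(n)
for rho in [1.0, 10.0, 100.0, 1000.0]:          # increasing penalty parameter
    L_rho = 1.0 + rho * np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient
    for _ in range(500):
        # One inner iteration = one projected gradient step on the penalized
        # objective f(x) + (rho/2) * ||A x - b||^2 over the simple set.
        g = (x - c) + rho * A.T @ (A @ x - b)
        x = np.clip(x - g / L_rho, 0.0, 1.0)
    print(f"rho = {rho:7.1f}   ||A x - b|| = {np.linalg.norm(A @ x - b):.2e}")
```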

On the quadratic support of strongly convex functions

In this paper, we first introduce the notion of $c$-affine functions for $c > 0$. Then we deal with some properties of strongly convex functions in real inner product spaces by using a quadratic support function at each point which is $c$-affine. Moreover, a Hyers–Ulam stability result for strongly convex functions is shown.
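For context, a minimal statement of the quadratic support property, assuming the common normalization in which strong convexity with modulus $c > 0$ means that $f - \tfrac{c}{2}\|\cdot\|^2$ is convex (the paper's $c$-affine normalization may differ by a constant factor): a differentiable strongly convex $f$ on a real inner product space satisfies

$$ f(y) \;\ge\; f(x) + \langle \nabla f(x),\, y - x \rangle + \tfrac{c}{2}\,\|y - x\|^2 \quad \text{for all } x, y, $$

so at every point $x$ the function is supported from below by a quadratic that touches it at $x$.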

A secant-based Nesterov method for convex functions

A simple secant-based fast gradient method is developed for problems whose objective function is convex and well-defined. The proposed algorithm extends the classical Nesterov gradient method by updating the estimate-sequence parameter with secant information whenever possible. This is achieved by imposing a secant condition on the choice of search point. Furthermore, the proposed algorithm emb...
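The paper's estimate-sequence update lies behind the truncated text, so the sketch below only illustrates the generic idea of extracting curvature from a secant pair: a Barzilai-Borwein-type estimate used as the step length inside a Nesterov-style loop, with a descent-lemma safeguard. This is not the algorithm of the paper, and all problem data are made up.

```python
import numpy as np

# Generic illustration: Nesterov-style extrapolation with a secant-based
# (Barzilai-Borwein-type) curvature estimate, safeguarded by the descent lemma.
rng = np.random.default_rng(2)
n = 10
M = rng.standard_normal((n, n))
Q = M.T @ M + np.eye(n)                   # positive definite Hessian (made up)
f = lambda v: 0.5 * v @ Q @ v
grad = lambda v: Q @ v

x = x_prev = rng.standard_normal(n)
y_prev, g_prev = x, grad(x)
L_k, theta = 1.0, 0.8                     # initial curvature guess, fixed momentum
for k in range(100):
    y = x + theta * (x - x_prev)          # Nesterov-style extrapolation point
    g = grad(y)
    s, v = y - y_prev, g - g_prev         # secant pair
    if s @ s > 1e-12 and s @ v > 0:
        L_k = (s @ v) / (s @ s)           # secant-based local curvature estimate
    while f(y - g / L_k) > f(y) - (g @ g) / (2.0 * L_k):
        L_k *= 2.0                        # safeguard: enforce sufficient decrease
    x, x_prev = y - g / L_k, x            # gradient step with the secant step size
    y_prev, g_prev = y, g
print(f"f(x) after 100 iterations: {f(x):.3e}")
```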

Journal

Journal: Journal of Optimization Theory and Applications

Year: 2023

ISSN: 0022-3239, 1573-2878

DOI: https://doi.org/10.1007/s10957-023-02261-w